Graduate Record Examinations

The Graduate Record Examinations (GRE) is a standardized test that is an admissions requirement for many graduate schools in the United States,[1] in other English-speaking countries, and for English-taught graduate and business programs worldwide. Created and administered by Educational Testing Service (ETS) in 1949,[2] the exam aims to measure verbal reasoning, quantitative reasoning, analytical writing, and critical thinking skills that have been acquired over a long period of time and that are not related to any specific field of study. The GRE General Test is offered as a computer-based, computer-adaptive exam administered by selected qualified testing centers; however, paper-based exams are offered in areas of the world where computer-based testing is not available.

In the graduate school admissions process, the level of emphasis that is placed upon GRE scores varies widely between schools and between departments within schools. The importance of a GRE score can range from being a mere admission formality to an important selection factor.

Critics of the GRE have argued that the exam format is so rigid that it effectively tests only how well a student can conform to a standardized test-taking procedure.[3] ETS responded by announcing plans in 2006 to radically redesign the test structure starting in the fall of 2007; however, the company later announced, "Plans for launching an entirely new test all at once were dropped, and ETS decided to introduce new question types and improvements gradually over time." The new questions have been gradually introduced since November 2007.[4] The GRE was significantly revised in August 2011.

The cost to take the test varies between US$130 and $210, depending on the country in which it is taken, although ETS will reduce the fee under certain circumstances and promotes financial aid for GRE applicants who can demonstrate economic hardship.[5] ETS erases all test records older than five years, although graduate program policies on accepting scores older than five years vary.


Structure

The computer-based GRE General Test consists of six sections. The first section is always the analytical writing section, involving separately timed issue and argument tasks. The next five sections consist of two verbal sections, two quantitative sections, and either an experimental or research section. These five sections may occur in any order. The experimental section does not count towards the final score but is not distinguished from the scored sections. The GRE is section-level adaptive in that the computer selects the second verbal or quantitative section based on performance on the first. Unlike on the computer-adaptive test administered prior to August 2011, the examinee is free to skip back and forth within sections. The entire testing procedure lasts about 3 hours 45 minutes.[6] One-minute breaks are offered after each section, with a 10-minute break after the third section.

The paper-based GRE General Test also consists of six sections and is only available in areas where computer-based testing is unavailable. The analytical writing component is split into two sections, one each for the issue and argument tasks. The next four sections consist of two verbal and two quantitative sections in varying order. There is no experimental section on the paper-based test.

Verbal section

The computer-based verbal sections assess reading comprehension, critical reasoning, and vocabulary usage. The verbal test is scored on a scale of 130–170, in 1-point increments (before August 2011, the scale was 200–800, in 10-point increments). In a typical examination, each verbal section consists of 20 questions to be completed in 30 minutes.[7] Each verbal section consists of about 6 text completion, 4 sentence equivalence, and 10 critical reading questions. The changes in 2011 included a reduced emphasis on rote vocabulary knowledge and the elimination of antonyms and analogies: text completion items replaced sentence completions, and new reading question types allowing the selection of multiple answers were added.

Quantitative section

The computer-based quantitative sections assess basic high school level mathematical knowledge and reasoning skills. The quantitative test is scored on a scale of 130–170, in 1-point increments (before August 2011, the scale was 200–800, in 10-point increments). In a typical examination, each quantitative section consists of 20 questions to be completed in 35 minutes.[8] Each quantitative section consists of about 8 quantitative comparisons, 9 problem solving items, and 3 data interpretation questions. The changes in 2011 included the addition of numeric entry items requiring the examinee to fill in a blank and multiple-choice items requiring the examinee to select multiple correct responses.[9]

Analytical writing section

The analytical writing section consists of two different essays, an "issue task" and an "argument task". The writing section is graded on a scale of 0–6, in half-point increments. The essays are written on a computer using a word processing program specifically designed by ETS. The program allows only basic computer functions and does not contain a spell-checker or other advanced features. Each essay is scored by at least two readers on a six-point holistic scale. If the two scores are within one point of each other, their average is taken. If they differ by more than a point, a third reader examines the response.
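The two-reader adjudication rule can be sketched as a short function. This is a minimal illustration only; the function name is invented, and how ETS combines a third reader's score is not specified in the source, so the sketch merely signals that escalation is required:

```python
def combine_scores(r1: float, r2: float):
    """Combine two holistic reader scores (0-6, half-point steps).

    Returns the average when the readers agree to within one point;
    returns None to signal that a third reader must examine the
    response (how the third score is combined is not specified here).
    """
    if abs(r1 - r2) <= 1.0:
        return (r1 + r2) / 2
    return None  # escalate to a third reader

print(combine_scores(4.0, 5.0))  # 4.5
print(combine_scores(3.0, 5.0))  # None
```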

Issue task

The test taker is given a topic upon which to write an essay. The time allowed for this essay is 30 minutes if taken after August 1, 2011.[10] Issue topics are selected from a pool of questions.[11]

Argument task

The test taker will be given an "argument" and will be asked to write an essay that critiques the argument. Test takers are asked to consider the argument's logic and to make suggestions about how to improve the logic of the argument. Test takers are expected to address the logical flaws of the argument, not to provide a personal opinion on the subject. The time allotted for this essay is 30 minutes.[12] Arguments are selected from a pool of topics.[13]

Experimental section

The experimental section, which can be either a verbal, quantitative, or analytical writing task, contains new questions ETS is considering for future use. Although the experimental section does not count towards the test-taker's score, it is unidentified and appears identical to the scored sections. Because test takers have no definite way of knowing which section is experimental, it is typically advised that they try their best on every section. Sometimes an identified research section at the end of the test is given instead of the experimental section.[14] There is no experimental section on the paper-based GRE.

Scoring

Computerized adaptive testing

The multiple-choice (verbal and quantitative) portions of the exam currently use a section-based computer-adaptive testing (CAT) format that automatically adjusts the difficulty of the sections as the test taker proceeds, depending on the number of correct or incorrect answers given.[15] The GRE revised General Test allows the test-taker to move back and forth and change answers within a section, and thus the questions within a given section are not adaptive.

While questions within each section are not adaptive, performance on the first Verbal and Quantitative section influences the difficulty in the second section of the same topic. This approach to administration yields scores that are of similar accuracy while using approximately half as many items.[16] However, this effect is moderated with the GRE because it has a fixed length; true CATs are variable length, where the test will stop itself once it has zeroed in on a candidate's ability level.

The actual scoring of the test is done with item response theory (IRT). While CAT is commonly associated with IRT, IRT is also used to score non-CAT exams: the GRE Subject Tests, which are administered in the traditional paper-and-pencil format, use the same IRT scoring approach. The difference CAT provides is that items are dynamically selected so that the test taker sees only items of appropriate difficulty. Besides the psychometric benefits, this has the added benefit of not wasting the examinee's time by administering items that are far too hard or too easy, as occurs in fixed-form testing.
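The core idea behind IRT-based adaptive selection can be sketched with a generic two-parameter logistic (2PL) model. This is a textbook illustration, not ETS's proprietary algorithm; the function names and parameter values are assumptions:

```python
import math

def p_correct(theta, a, b):
    """2PL IRT model: probability that an examinee of ability theta
    answers correctly an item with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta; items are
    most informative when their difficulty matches the examinee."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def pick_next_item(theta_hat, items):
    """Select the (a, b) item most informative at the current ability
    estimate -- the essence of adaptive item (or section) selection."""
    return max(items, key=lambda it: item_information(theta_hat, *it))

# An examinee estimated at theta = 0 is best served by the item
# whose difficulty is closest to 0.
pool = [(1.0, -2.0), (1.0, 0.1), (1.0, 3.0)]
print(pick_next_item(0.0, pool))  # (1.0, 0.1)
```

Because each administered item is targeted near the examinee's estimated ability, a fixed level of score precision is reached with far fewer items than in a fixed-form test, which matches the roughly half-length efficiency noted above.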

An examinee can miss one or more questions on a multiple-choice section and still receive a perfect score of 170. Likewise, even if no question is answered correctly, 130 is the lowest possible score.[17]

Scaled score percentiles

The percentiles for the current revised General Test are as follows; means and standard deviations for the measures on the new score scale are not yet available:[18]

Scaled score Verbal Reasoning % Quantitative Reasoning %
170 99 99
169 99 98
168 98 96
167 98 95
166 97 94
165 96 93
164 94 91
163 93 88
162 90 87
161 89 86
160 86 84
159 84 82
158 79 79
157 77 77
156 72 74
155 69 69
154 64 67
153 62 65
152 56 61
151 51 56
150 48 53
149 42 49
148 40 44
147 36 40
146 31 36
145 28 32
144 26 26
143 21 22
142 18 19
141 16 16
140 13 12
139 10 10
138 8 7
137 6 6
136 5 4
135 4 3
134 3 2
133 2 1
132 1 1
131 1 1
130 1 1
Analytical Writing score Writing % Below
6 99
5.5 96
5 87
4.5 72
4 48
3.5 29
3 11
2.5 4
2 1
1.5 1
1 1
0.5 1

The percentiles for the former test are as follows:

Scaled score Verbal Reasoning % Quantitative Reasoning %
800 99 94
780 99 89
760 99 84
740 99 80
720 98 75
700 97 70
680 96 66
660 94 61
640 92 57
620 89 52
600 86 47
580 85 43
560 77 38
540 72 34
520 67 30
500 62 27
480 57 23
460 52 20
440 46 17
420 40 15
400 35 12
380 29 10
360 24 8
340 19 7
320 13 5
300 8 4
280 5 3
260 2 2
240 1 1
220 0 1
200 0 0
mean 456 590
standard deviation 120 150
Analytical Writing score Writing % Below
6 99
5.5 94
5 84
4.5 67
4 45
3.5 26
3 10
2.5 4
2 1
1.5 0
1 0
0.5 0
mean 3.8
standard deviation 0.9

Comparisons for "Intended Graduate Major" are "limited to those who earned their college degrees up to two years prior to the test date." ETS provides no score data for "non-traditional" students who have been out of school more than two years, although its own report "RR-99-16" indicated that 22% of all test takers in 1996 were over the age of 30.

Use in admissions

Many graduate schools in English-speaking countries (especially in the United States) require GRE results as part of the admissions process. The GRE is a standardized test intended to measure the abilities of all graduates in tasks of general academic nature, regardless of their fields of specialization. The GRE is supposed to measure the extent to which undergraduate education has developed an individual's verbal and quantitative skills in abstract thinking.

Unlike other standardized admissions tests (such as the SAT, LSAT, and MCAT), the use and weight of GRE scores vary considerably not only from school to school, but also from department to department and from program to program. Programs in the liberal arts may consider only the applicant's verbal score to be of interest, while mathematics and science programs may consider only quantitative ability; however, since most applicants to mathematics, science, or engineering graduate programs have high quantitative scores, the verbal score can become a deciding factor even in these programs. Admission to graduate school depends on a complex mix of factors: schools consider letters of recommendation, the statement of purpose, GPA, and GRE scores, among others. Some schools use the GRE in admissions decisions but not in funding decisions; others use it for the selection of scholarship and fellowship candidates but not for admissions. In some cases, the GRE may be a general requirement for graduate admissions imposed by the university, while particular departments may not consider the scores at all. Graduate schools typically provide information about how the GRE is considered in admissions and funding decisions, along with the average scores of previously admitted students. The best way to find out how a particular school or program evaluates a GRE score in the admissions process is to contact the person in charge of graduate admissions for the specific program in question (not the graduate school in general).

Programs that involve significant expository writing generally require the submission of a prepared writing sample, which is considered more useful than the analytical writing section in determining writing ability; however, the writing scores of foreign students are sometimes given more scrutiny and used as an indicator of overall comfort with and mastery of conversational English.

GRE Subject Tests

In addition to the General Test, there are also eight GRE Subject Tests testing knowledge in the specific areas of Biochemistry, Cell and Molecular Biology; Biology; Chemistry; Computer Science; Literature in English; Mathematics; Physics; and Psychology. The length of each exam is 170 minutes.

In the past, subject tests were also offered in the areas of Economics, Education, Engineering, Geology, History, Music, Political Science, and Sociology. In April 1998, the Education and Political Science exams were discontinued. In April 2000, the History and Sociology exams were discontinued, and the remaining four were discontinued in April 2001.[19]

GRE and GMAT

The GMAT (Graduate Management Admission Test) is a computer adaptive standardized test in mathematics and the English language for measuring aptitude to succeed academically in graduate business studies. Business schools commonly use the test as one of many selection criteria for admission into an MBA program. However, there are many business schools that also accept GRE scores.


In comparison with the GMAT's emphasis on logic, the GRE places more weight on vocabulary. This difference is reflected in the structure of each test: although both include an analytical writing section, the GRE verbal section has featured analogies, antonyms, sentence completions, and reading comprehension passages, while the GMAT has sentence correction, critical reasoning, and reading comprehension.

A higher level of mathematical ability is also required to earn a good score on the GMAT. The GRE is more appealing to international MBA students and applicants from non-traditional backgrounds.[24]

Preparation

A variety of resources are available for those wishing to prepare for the GRE. Upon registration, ETS provides preparation software called PowerPrep, which contains two practice tests of retired questions, as well as further practice questions and review material. Since the software replicates both the test format and the questions used, it can be useful for predicting actual GRE scores. ETS does not license its past questions to any other company, making it the only source for official retired material. ETS used to publish the "BIG BOOK", which contained a number of actual GRE questions, but this publication has been discontinued. Several companies provide courses, books, and other unofficial preparation materials.

ETS has claimed that the content of the GRE is not coachable; however, test preparation companies such as Kaplan and The Princeton Review claim that the test format is so rigid that familiarizing oneself with the test's organization, timing, specific foci, and the use of process of elimination is the best way to increase a GRE score.[25]

Testing locations

While the general and subject tests are held at many undergraduate institutions, the computer-based general test is only held at test centers with appropriate technological accommodations. Students in major cities in the United States, or those attending large U.S. universities, will usually find a nearby test center, while those in more isolated areas may have to travel a few hours to an urban or university location. Many industrialized countries also have test centers, but at times test-takers must cross country borders.

Validity

An analysis of the GRE's validity in predicting graduate school success found a correlation of .30 to .45 between the GRE and both first year and overall graduate GPA. The correlation between GRE score and graduate school completion rates ranged from .11 (for the now defunct analytical section) to .39 (for the GRE subject test). Correlations with faculty ratings ranged from .35 to .50.[26]

Criticism

Bias

Critics have claimed that the computer-adaptive methodology may discourage some test takers, because question difficulty changes with performance. For example, a test-taker presented with remarkably easy questions halfway into the exam may infer that they are performing poorly, which can undermine their performance as the exam continues, even though perceived question difficulty is subjective. By contrast, standard testing methods may discourage students by giving them more difficult items earlier on.

Critics have also stated that the computer-adaptive method of placing more weight on the first several questions is biased against test takers who typically perform poorly at the beginning of a test due to stress or confusion before becoming more comfortable as the exam continues.[27] On the other hand, standard fixed-form tests could equally be said to be "biased" against students with less testing stamina since they would need to be approximately twice the length of an equivalent computer adaptive test to obtain a similar level of precision.[16]

The GRE has also been subjected to the same racial bias criticisms that have been lodged against other admissions tests. In 1998, the Journal of Blacks in Higher Education noted that the mean score for black test-takers in 1996 was 389 on the verbal section, 409 on the quantitative section, and 423 on the analytic, while white test-takers averaged 496, 538, and 564, respectively.[28] The National Association of Test Directors Symposia in 2004 stated a belief that simple mean score differences may not constitute evidence of bias unless the populations are known to be equal in ability.[29] A more effective, accepted, and empirical approach is the analysis of differential test functioning, which examines the differences in item response theory curves for subgroups; the best approach for this is the DFIT framework.[30]
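A simplified illustration of the idea behind differential functioning analysis is to fit item response curves separately for two subgroups and measure the area between them; a large area means the item behaves differently for examinees of the same ability. This generic 2PL sketch is not the DFIT framework itself, and all parameter values are assumptions:

```python
import math

def icc(theta, a, b):
    """Two-parameter logistic item characteristic curve: probability
    of a correct response at ability theta, given discrimination a
    and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def unsigned_area_dif(ref, focal, lo=-4.0, hi=4.0, steps=800):
    """Approximate the unsigned area between the reference-group and
    focal-group ICCs over [lo, hi] by a Riemann sum. An area near 0
    means the item functions the same for both groups."""
    step = (hi - lo) / steps
    return sum(
        abs(icc(lo + i * step, *ref) - icc(lo + i * step, *focal))
        for i in range(steps)
    ) * step

# Identical parameters -> no differential functioning.
print(unsigned_area_dif((1.0, 0.0), (1.0, 0.0)))  # 0.0
# A one-unit difficulty shift for the focal group -> area close to 1.
print(unsigned_area_dif((1.0, 0.0), (1.0, 1.0)))
```

Unlike a raw mean-score comparison, this kind of analysis conditions on ability, which is why it is considered a more defensible test of item bias.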

Weak predictor of graduate school performance

The GRE is criticized for not being a true measure of whether a student will be successful in graduate school. Robert Sternberg (now of Oklahoma State University–Stillwater; at Yale University at the time of the study), a long-time critic of modern intelligence testing in general, found that the GRE General Test was weakly predictive of success in graduate studies in psychology.[31] The strongest relationship was found for the now-defunct analytical portion of the exam.

ETS published a report ("What is the Value of the GRE?") that points to the predictive value of the GRE for a student's index of success at the graduate level.[32] A problem with earlier studies is the statistical phenomenon of restriction of range: a correlation coefficient is sensitive to the range sampled for the test. Specifically, if only students accepted to graduate programs are studied (as in Sternberg & Williams and other research), the relationship is attenuated. Validity coefficients range from .30 to .45 between the GRE and both first-year and overall graduate GPA in ETS's study.[26]
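Restriction of range is easy to demonstrate with a small simulation: if a predictor and an outcome correlate at about .5 in the full applicant pool, computing the correlation only among the top-scoring "admitted" applicants shrinks it substantially. The numbers below are entirely synthetic and do not reflect actual GRE data:

```python
import random

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(0)
# Synthetic applicant pool: test score predicts graduate GPA with a
# true correlation of 0.5 (both on arbitrary standardized scales).
scores = [random.gauss(0, 1) for _ in range(20000)]
gpas = [0.5 * s + random.gauss(0, (1 - 0.5 ** 2) ** 0.5) for s in scores]

full = pearson_r(scores, gpas)

# "Admit" only the top 30% of scorers and recompute the correlation.
cutoff = sorted(scores)[int(0.7 * len(scores))]
admitted = [(s, g) for s, g in zip(scores, gpas) if s >= cutoff]
restricted = pearson_r([s for s, _ in admitted], [g for _, g in admitted])

print(round(full, 2), round(restricted, 2))  # restricted is much smaller
```

Studying only admitted students truncates the predictor's variance, which mechanically deflates the observed validity coefficient even when the underlying relationship is unchanged.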

Historical susceptibility to cheating

In May 1994, Kaplan, Inc warned ETS, in hearings before a New York legislative committee, that the small question pool available to the computer-adaptive test made it vulnerable to cheating. ETS assured investigators that it was using multiple sets of questions and that the test was secure. This was later discovered to be incorrect.[33]

In December 1994, prompted by student reports of recycled questions, Jose Ferreira, then Director of GRE Programs for Kaplan, Inc and later CEO of Knewton, led a team of 22 staff members deployed to 9 U.S. cities to take the exam. Kaplan, Inc then presented ETS with 150 questions, representing 70–80% of the GRE.[34] According to early news releases, ETS appeared grateful to Stanley H. Kaplan, Inc for identifying the security problem. However, on December 31, ETS sued Kaplan, Inc for violation of a federal electronic communications privacy act, copyright laws, breach of contract, fraud, and the confidentiality agreement signed by test-takers on test day.[35] On January 2, 1995, an agreement was reached out of court.

Additionally, in 1994, the scoring algorithm for the computer-adaptive form of the GRE was discovered to be insecure. ETS acknowledged that Kaplan, Inc employees, led by Jose Ferreira, had reverse-engineered key features of the GRE scoring algorithm. The researchers found that a test taker's performance on the first few questions of the exam had a disproportionate effect on the final score. To preserve the integrity of scores, ETS revised its scoring and now uses a more sophisticated algorithm.

2011 Revision of the GRE

In 2006, ETS announced plans to enact significant changes in the format of the GRE. Planned changes for the revised GRE included a longer testing time, a departure from computer-adaptive testing, a new grading scale, and an enhanced focus on reasoning skills and critical thinking for both the quantitative and verbal sections.[36]

On April 2, 2007, ETS announced the decision to cancel plans for revising the GRE.[37] The announcement cited concerns over the ability to provide clear and equal access to the new test after the planned changes as an explanation for the cancellation. ETS stated, however, that it planned "to implement many of the planned test content improvements in the future", although specific details regarding those changes had not yet been announced.

Changes to the GRE took effect on November 1, 2007, as ETS started to include new types of questions in the exam. The changes mostly center on "fill in the blank" type answers for both the mathematics and vocabulary sections that require the test-taker to fill in the blank directly, without being able to choose from a multiple choice list of answers. ETS currently plans to introduce two of these new types of questions in each quantitative or vocabulary section, while the majority of questions will be presented in the regular format.[38]

Since January 2008, the Reading Comprehension within the verbal sections has been reformatted, passages' "line numbers will be replaced with highlighting when necessary in order to focus the test taker on specific information in the passage" to "help students more easily find the pertinent information in reading passages."[39]

In December 2009, ETS announced plans to move forward with significant revisions to the GRE in 2011.[40] Changes included a new 130–170 scoring scale, the elimination of certain question types such as antonyms and analogies, the addition of an online calculator, and the replacement of question-by-question CAT adjustment with section-by-section adjustment.[41] The revised GRE General Test replaced the previous General Test on August 1, 2011. ETS describes the revised test as better designed and as offering a better test-taking experience, with new question types intended to test the skills needed in graduate and business school programs.[42]

GRE prior to October 2002

The earliest versions of the GRE tested only for verbal and quantitative ability. For a number of years prior to October 2002, the GRE had a separate Analytical Ability section which tested candidates on logical and analytical reasoning abilities. This section was replaced by the Analytical Writing portion.

References

  1. ^ GRE Registration and Information Bulletin
  2. ^ Alternative Admissions and Scholarship Selection Measures in Higher Education.
  3. ^ Princeton Review, Cracking the GRE, 2007 edition, p. 19. ISBN 0-375-76551-4, ISBN 978-0-375-76551-3. 2006
  4. ^ GRE General Test to Include New Questions
  5. ^ MBA Channel: "GRE:Wharton joins the club" 31 July 2009
  6. ^ GRE Test Content
  7. ^ GRE Test Content
  8. ^ GRE Test Content
  9. ^ Weiner-Green, Sharon; Wolf, Ira K (2009), Barron's How to Prepare for the GRE (17 ed.), Barron's Educational Series, p. 9, ISBN 0764174711 
  10. ^ GRE Revised Analytical Writing
  11. ^ The Pool of Issue Topics
  12. ^ GRE Test Content
  13. ^ The Pool of Argument Topics
  14. ^ GRE Test Content
  15. ^ http://www.ets.org/gre/revised_general/about/content/cbt/
  16. ^ a b Weiss, D. J.; Kingsbury, G. G. (1984). "Application of computerized adaptive testing to educational problems". Journal of Educational Measurement 21 (4): 361–375. doi:10.1111/j.1745-3984.1984.tb01040.x. 
  17. ^ http://www.ets.org/gre/revised_general/scores/
  18. ^ http://www.ets.org/s/gre/pdf/gre_guide.pdf Guide to the Use of Scores 2011-2012
  19. ^ http://www.physicsgre.com/engineering-gre.shtml
  20. ^ HBS: "Admissions Requirements" 29 December 2010
  21. ^ Darden: " Application Requirements" 29 December 2010
  22. ^ Sloan: "Application Instructions" 29 December 2010
  23. ^ "Aptitude and English Tests: MBA Program: Stanford GSB". Stanford Graduate School of Business. http://www.gsb.stanford.edu/mba/admission/test_scores.html#gmat_gre. 
  24. ^ MBA Channel: "GRE: Wharton joins the club" 31 July 2009
  25. ^ Princeton Review, Cracking the GRE, 2007 edition, p. 19. ISBN 0-375-76551-4, ISBN 978-0-375-76551-3. 2006
  26. ^ a b Kuncel, N. R.; Hezlett, S. A.; Ones, D. S. (2001). "A comprehensive meta-analysis of the predictive validity of the Graduate Record Examinations: Implications for graduate student selection and performance". Psychological Bulletin 127 (1): 162–181. http://web.uvic.ca/psyc/lindsay/teaching/499/readings/kuncel.pdf. 
  27. ^ "Testing service cancels February GRE". http://www.dailybruin.ucla.edu/archives/id/4597/. 
  28. ^ "Estimating the Effect a Ban on Racial Preferences Would Have on African- American Admissions to the Nation's Leading Graduate Schools". The Journal of Blacks in Higher Education 19: 80–82. 1998. JSTOR 2998926. 
  29. ^ The Achievement Gap: Test Bias or School Structures? National Association of Test Directors 2004 Symposia [1]
  30. ^ Oshima, T. C.; Morris, S. B. (2008). "Raju's Differential Functioning of Items and Tests (DFIT)". Educational Measurement: Issues and Practice 27 (3): 43–50. 
  31. ^ Sternberg, R. J.; Williams, W. M. (1997). "Does the Graduate Record Examinations predict meaningful success in the graduate training of psychology? A case study". American Psychologist 52: 630–641. http://www.news.cornell.edu/releases/Aug97/GRE.study.ssl.html. 
  32. ^ http://www.ets.org/Media/Tests/GRE/pdf/grevalue.pdf
  33. ^ Frantz, Douglas; Nordheimer, Jon (September 28, 1997). "Giant of Exam Business Keeps Quiet on Cheating". The New York Times. http://www.nytimes.com/1997/09/28/us/giant-of-exam-business-keeps-quiet-on-cheating.html?sec=&spon=&pagewanted=all. Retrieved April 2, 2010. 
  34. ^ "Computer Admissions Test Found to Be Ripe for Abuse". The New York Times. December 16, 1994. http://www.nytimes.com/1994/12/16/us/computer-admissions-test-found-to-be-ripe-for-abuse.html?scp=1&sq=Ripe%20for%20abuse&st=cse. Retrieved April 2, 2010. 
  35. ^ Boxall, Bettina (January 1, 1995). "Educational Testing Service Sues Exam-Coaching Firm". Los Angeles Times. http://articles.latimes.com/1995-01-01/news/mn-15369_1_educational-testing-service. Retrieved May 4, 2010. 
  36. ^ Comparison Chart of GRE Changes
  37. ^ Plans for the Revised GRE Cancelled
  38. ^ GRE General Test to Include New Question Types in November
  39. ^ Revisions to the Computer-based GRE General Test in 2008 at the Wayback Machine (archived August 22, 2008)
  40. ^ Revised General Test
  41. ^ A New Look for Graduate Entrance Test
  42. ^ Revised GRE FAQs
